20 Years Supercomputer Market Analysis
Abstract
Since the very beginning of the International Supercomputer Conference (ISC) series, one important focus has been the analysis of the supercomputer market. For 20 years, statistics about this marketplace have been published at ISC. Initially these were based on simple market surveys; since 1993 they have been based on the TOP500 project, which has become the accepted standard for such data. We take the occasion of the 20th anniversary of ISC to combine and extend several previously published articles based on these data. We analyze our methodologies for collecting our statistics and illustrate the major developments, trends, and changes in the High Performance Computing (HPC) marketplace and the supercomputer market since the introduction of the Cray-1 system. The introduction of vector computers started the era of modern ‘supercomputing’. The initial success of vector computers in the seventies and early eighties was driven by raw performance. In the second half of the eighties, the availability of standard development environments and of application software packages became more important. Alongside performance, these criteria determined the success of multiprocessor (MP) vector systems, especially among industrial customers. Massively parallel processing (MPP) systems became successful in the early nineties due to their better price/performance ratios, which was enabled by the attack of the ‘killer micros’. In the lower and medium segments of the market, MPPs were replaced by microprocessor-based symmetric multiprocessor (SMP) systems in the middle of the nineties. Towards the end of the nineties, only the companies that had entered the emerging markets for massively parallel database servers and financial applications attracted enough business volume to be able to support hardware development for the numerical high-end computing market as well. Success in the traditional floating-point-intensive engineering applications was no longer sufficient for survival in the market.
The success of microprocessor-based SMP concepts, even for the very high-end systems, was the basis for the emerging cluster concepts in the early 2000s. Within the first half of that decade, clusters of PCs and workstations became the prevalent architecture for many HPC application areas in the TOP500 at all performance levels. However, the success of the Earth Simulator vector system demonstrated that many scientific applications could benefit greatly from other computer architectures. At the same time, there is renewed broad interest in the scientific HPC community in new hardware architectures and new programming paradigms. The IBM BlueGene/L system is one early example of a shifting design focus for large-scale systems. Built from low-performance but very low-power components, it allows a tight integration of an unprecedented number of processors to achieve surprising performance levels for suitable applications. The DARPA HPCS program has the declared goal of building a PetaFlops computer by the end of the decade using novel computer architectures.
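The TOP500 ranking mentioned above orders systems by their measured HPL (Linpack) Rmax performance. As a minimal sketch of that ordering principle (the system names and Rmax values below are made up for illustration, not actual list entries):

```python
# Illustrative TOP500-style ranking: order systems by measured Rmax,
# highest first. All entries here are invented example data.
systems = [
    {"name": "System A", "rmax_tflops": 136.8},
    {"name": "System B", "rmax_tflops": 35.9},
    {"name": "System C", "rmax_tflops": 91.3},
]

# Rank 1 goes to the system with the largest Rmax.
ranked = sorted(systems, key=lambda s: s["rmax_tflops"], reverse=True)

for rank, s in enumerate(ranked, start=1):
    print(f"{rank:3d}  {s['name']}  {s['rmax_tflops']:.1f} TFlop/s")
```

The actual list additionally records theoretical peak performance (Rpeak), installation site, segment, and vendor for each entry, which is what enables the market analyses described in the abstract.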
Similar articles
Richard Walsh, Steve Conway (sponsored by IBM)
Fifteen years ago, the high-performance computing (HPC) market started to abandon its data-parallel, vector architectural lineage and turned to commodity-priced scalar processors. One by one, the other custom components of HPC systems have been pushed aside in favor of cheaper, standards-based alternatives. With some notable exceptions, most HPC system component technologies have been mainstrea...
Spatial analysis on the state of the housing market in Tehran
Like other economic markets, the housing market comprises supply and demand, and equality of supply and demand brings the housing market into balance. The aim of this paper is to investigate the characteristics of the housing market of the city of Tehran and some deficiencies of this market, using a fundamental-applied methodology. The variable studied here is the saleable residential unit with minim...
Visitor programmes, HPC facilities, consultancy, research software training: 20 years of EPCC
2 EPCC at SC’10; 3 PlanetHPC: the roadmap for HPC in Europe; Faster simulation codes on HECToR; 4 EPCC visitor programmes; 6 TEXT: towards exaflop applications; PRACE; 7 HECToR: the journey to multi-core; 8 SPRINT: speeding up research analysis; 9 Supercomputing liquid crystal superstructures; 10 MSc in High Performance Computing; 11 FPGA researcher wins Industrial Fellowship; All Hands 2010; 12 Celebratin...
متن کاملA Welfare Analysis of Wheat Self-Sufficiency Policy and the Influence on the Barley Market in Iran: A Game Theory Approach
Iran achieved its self-sufficiency goal in wheat production a few years ago, perhaps at the expense of decreasing the production of other grains, especially barley, as stated by critics in the country. Considering the dependency of the wheat and barley markets on each other, policy preference functions were estimated separately for each market. Incorporating political weights, a game theory ...
High-Performance Computing in Industry
In 1993, a list of the top 500 supercomputer sites worldwide was made available for the first time. Since then, the TOP500 list has been published twice a year. The list allows a detailed and well-founded analysis of the state of high-performance computing (HPC). This article summarizes the recent trends in application areas of HPC systems, focusing on the increase in industrial installations and...